Discrimination Measure of Correlations in a Population of Neurons by Using the Jensen-Shannon Divergence

Authors

  • F. Montani
  • O. A. Rosso
  • S. R. Schultz
Abstract

The significance for perception of synchronized spikes fired by nearby neurons is still unclear. To evaluate how reliably one can decide whether a given population response to a sensory stimulus comes from the full joint distribution or from the product of independent distributions for each cell, we used recorded responses of pairs of single neurons in primary visual cortex (V1) of macaque monkey to stimuli of varying orientation. Both trial-to-trial variability and synchrony were found to depend on stimulus orientation and contrast in this data set (A. Kohn and M. A. Smith, J. Neurosci. 25 (2005) 3661). We used the Jensen-Shannon divergence, for fixed stimuli, as a measure of discrimination between pairs of correlated cells in V1. The Jensen-Shannon divergence can be considered a measure of distance between the probability distribution functions associated with the observed spiking patterns. The Nemenman-Shafee-Bialek estimator was used in our entropy estimation in order to remove bias from our calculations. We found that the relative Jensen-Shannon divergence (measured relative to the case in which all cells fire completely independently) decreases with the difference in orientation preference between the receptive fields of each pair of cells. Our findings indicate that the Jensen-Shannon divergence may be used to characterize the effective circuitry in a population of neurons.
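The discrimination measure described above can be sketched in a few lines of code. The function below is a minimal, hypothetical illustration of the Jensen-Shannon divergence between a joint response distribution and the product of its marginals (the "independent cells" surrogate the abstract refers to); it does not reproduce the paper's NSB-corrected entropy estimation, and the example distribution is invented for illustration only.

```python
import numpy as np

def jensen_shannon_divergence(p, q):
    """JSD(P, Q) = H(M) - (H(P) + H(Q)) / 2, with M = (P + Q) / 2,
    where H is the Shannon entropy in bits. Bounded in [0, 1]."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()   # normalize to probability distributions
    m = 0.5 * (p + q)

    def entropy(d):
        d = d[d > 0]                  # convention: 0 * log(0) = 0
        return -np.sum(d * np.log2(d))

    return entropy(m) - 0.5 * (entropy(p) + entropy(q))

# Hypothetical joint distribution P(r1, r2) over binary responses of two cells,
# compared against the product of its marginals P(r1) * P(r2).
joint = np.array([[0.40, 0.10],
                  [0.10, 0.40]])      # correlated: diagonal outcomes dominate
marg1 = joint.sum(axis=1)             # P(r1)
marg2 = joint.sum(axis=0)             # P(r2)
independent = np.outer(marg1, marg2)  # surrogate with correlations removed
print(jensen_shannon_divergence(joint.ravel(), independent.ravel()))
```

A larger value indicates that the joint pattern distribution is easier to tell apart from its shuffled, independent counterpart; with naive plug-in entropies as here, limited trial counts would bias the estimate, which is what the Nemenman-Shafee-Bialek estimator corrects in the paper.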

Similar articles

Bounds on Non-Symmetric Divergence Measures in Terms of Symmetric Divergence Measures

Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are Kullback-Leibler [13] relative information and Jeffreys [12] J-divergence. Sibson's [17] Jensen-Shannon divergence has also found applications in the literature. The author [20] studied new divergence measures based on arithmetic and geometric means....


Seven Means, Generalized Triangular Discrimination, and Generating Divergence Measures

Jensen-Shannon, J-divergence and arithmetic-geometric mean divergences are three classical divergence measures known in the information theory and statistics literature. These three divergence measures bear an interesting inequality among the three non-logarithmic measures known as triangular discrimination, Hellinger's divergence and symmetric chi-square divergence. However, in 2003, Eve studied ...


A Graph Embedding Method Using the Jensen-Shannon Divergence

Riesen and Bunke recently proposed a novel dissimilarity-based approach for embedding graphs into a vector space. One drawback of their approach is the computational cost of the graph edit operations required to compute the dissimilarity for graphs. In this paper we explore whether the Jensen-Shannon divergence can be used as a means of computing a fast similarity measure between a pair of graphs. We ...


Comparison of Redundancy and Relevance Measures for Feature Selection in Tissue Classification of CT Images

In this paper we report on a study on feature selection within the minimum–redundancy maximum–relevance framework. Features are ranked by their correlations to the target vector. These relevance scores are then integrated with correlations between features in order to obtain a set of relevant and least–redundant features. Applied measures of correlation or distributional similarity for redunanc...


A Sequence of Inequalities among Difference of Symmetric Divergence Measures

In this paper we have considered two one-parametric generalizations. These two generalizations include in particular the well-known measures such as: J-divergence, Jensen-Shannon divergence and arithmetic-geometric mean divergence. These three measures are logarithmic expressions. Also, as particular cases we have measures such as: Hellinger discrimination, symmetric χ2-divergence, and trian...




Publication date: 2007